Simple examples for the failure of Newton's method with line search for strictly convex minimization
Authors
Abstract
In this paper, two simple examples of a twice continuously differentiable, strictly convex function f are presented for which Newton's method with line search converges to a point where the gradient of f is not zero. The first example uses a line search based on the Wolfe conditions. For the second example, a strictly convex function f is defined, together with a sequence of descent directions for which exact line searches do not converge to the minimizer of f. Then f is perturbed so that these search directions coincide with the Newton directions for the perturbed function, while the exact line searches are left invariant.
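The method under discussion is standard Newton iteration globalized by a Wolfe-condition line search. The following sketch shows that scheme on an ordinary strictly convex quadratic, where it behaves well; the paper's contribution is constructing functions on which this very scheme fails. The function, the bisection-style Wolfe search, and all names here are illustrative assumptions, not the paper's counterexample.

```python
import numpy as np

# Illustrative strictly convex quadratic (NOT the paper's counterexample).
A = np.diag([1.0, 10.0])

def f(x):
    return 0.5 * x @ A @ x

def grad(x):
    return A @ x

def hess(x):
    return A

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, alpha=1.0):
    """Bisection-style search for a step satisfying the weak Wolfe conditions:
       f(x + a d) <= f(x) + c1 * a * g'd   (sufficient decrease)
       grad(x + a d)'d >= c2 * g'd         (curvature)
    """
    g0 = grad(x) @ d
    lo, hi = 0.0, np.inf
    for _ in range(50):
        if f(x + alpha * d) > f(x) + c1 * alpha * g0:
            hi = alpha                      # step too long
        elif grad(x + alpha * d) @ d < c2 * g0:
            lo = alpha                      # step too short
        else:
            return alpha                    # both Wolfe conditions hold
        alpha = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
    return alpha

x = np.array([10.0, 1.0])
for _ in range(20):
    d = -np.linalg.solve(hess(x), grad(x))  # Newton direction
    x = x + wolfe_line_search(f, grad, x, d) * d

print(np.linalg.norm(grad(x)))  # essentially zero on this quadratic
```

On a quadratic, the unit Newton step already satisfies both Wolfe conditions and the iteration reaches the minimizer immediately; the paper's examples show that for a carefully perturbed strictly convex f, the same iteration can converge to a non-stationary point.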
Similar articles
On the divergence of line search methods
We discuss the convergence of line search methods for minimization. We explain how Newton’s method and the BFGS method can fail even if the restrictions of the objective function to the search lines are strictly convex functions, the level sets of the objective functions are compact, the line searches are exact and the Wolfe conditions are satisfied. This explanation illustrates a new way to co...
Bound constrained quadratic programming via piecewise quadratic functions
We consider the strictly convex quadratic programming problem with bounded variables. A dual problem is derived using Lagrange duality. The dual problem is the minimization of an unconstrained, piecewise quadratic function. It involves a lower bound of λ1, the smallest eigenvalue of a symmetric, positive definite matrix, and is solved by Newton iteration with line search. The paper describes th...
A Gauss-Newton method for convex composite optimization
An extension of the Gauss-Newton method for nonlinear equations to convex composite optimization is described and analyzed. Local quadratic convergence is established for the minimization of h ∘ F under two conditions, namely h has a set of weak sharp minima, C, and there is a regular point of the inclusion F(x) ∈ C. This result extends a similar convergence result due to Womersley (this journa...
Regularized Newton Methods for Convex Minimization Problems with Singular Solutions
This paper studies convergence properties of regularized Newton methods for minimizing a convex function whose Hessian matrix may be singular everywhere. We show that if the objective function is LC2, then the methods possess local quadratic convergence under a local error bound condition without the requirement of isolated nonsingular solutions. By using a backtracking line search, we globaliz...
SIZE AND GEOMETRY OPTIMIZATION OF TRUSS STRUCTURES USING THE COMBINATION OF DNA COMPUTING ALGORITHM AND GENERALIZED CONVEX APPROXIMATION METHOD
In recent years, the optimization of truss structures has been considered due to their several applications and their simple structure and rapid analysis. DNA computing algorithm is a non-gradient-based method derived from numerical modeling of DNA-based computing performance by new computers with DNA memory known as molecular computers. DNA computing algorithm works based on collective intelli...
Journal: Math. Program.
Volume: 158, Issue: -
Pages: -
Published: 2016